Gibbs Paradox and the Concepts of Information, Symmetry, Similarity and Their Relationship
Author
Abstract
We are publishing volume 10 of Entropy. When I was a chemistry student I was fascinated by thermodynamic problems, particularly the Gibbs paradox. It has now been more than 10 years since I actively published on this topic [1-4]. During this decade, the globalized Information Society has been developing very quickly based on the Internet, and the term “information” is widely used, but what is information? What is its relationship with entropy and other concepts like symmetry, distinguishability and stability? What is the state of entropy research in general? As the Editor-in-Chief of Entropy, I feel it is time to offer some comments, present my own opinions on these matters and point out a major flaw in related studies.
Similar resources
Correlation of Entropy with Similarity and Symmetry
Informational entropy is quantitatively related to similarity and symmetry. Some tacit assumptions regarding their correlation have been shown to be wrong. The Gibbs paradox statement (indistinguishability corresponds to minimum entropy, which is zero) has been rejected. All their correlations are based on the relation that less information content corresponds to more entropy. Higher value of e...
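As a minimal sketch of the relation this snippet rests on (our notation, not necessarily the paper's): if S denotes the informational entropy of a system, S_max its value at maximum uncertainty, and I the information content, one common formalization is

\[
  I = S_{\max} - S ,
\]

so any process that raises S lowers I, which is the sense in which less information content corresponds to more entropy.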
Asymmetry: The Foundation of Information. By Scott Muller. Springer: Berlin, 2007. VIII, 165 p, 33 illus., Hardcover. CHF 139.50. ISBN: 978-3-540-69883-8
Normally a religious book should be read at least 100 times, a philosophy book 10 times, and a science monograph should be read carefully at least once before you can claim that you have read the book and understood something. However, with the information explosion, people nowadays do not have enough time to read a book even once from cover to cover, even books prepared by philosophers. That ...
International Journal of Molecular Science 2001, 2, 1-9
Symmetry is a measure of indistinguishability. Similarity is a continuous measure of imperfect symmetry. Lewis' remark that “gain of entropy means loss of information” defines the relationship of entropy and information. Three laws of information theory have been proposed. Labeling by introducing nonsymmetry and formatting by introducing symmetry are defined. The function L (L = ln w, w is the nu...
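The truncated formula can be read as the familiar logarithmic measure; a minimal sketch under our assumption that w counts indistinguishable arrangements (the snippet is cut off before w is fully defined):

\[
  L = \ln w ,
\]

which parallels the Boltzmann form S = k_B ln W of thermodynamic entropy; as w grows, the entropy-like quantity L grows and, per Lewis's remark, information is lost.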
Gibbs paradox of entropy of mixing: experimental facts, its rejection and the theoretical consequences
The Gibbs paradox statement of the entropy of mixing has been regarded as the theoretical foundation of statistical mechanics, quantum theory and biophysics. A large number of relevant chemical and physical observations show that the Gibbs paradox statement is false. We also disprove the Gibbs paradox statement through consideration of symmetry, similarity, entropy additivity and the defined property o...
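For orientation, the disputed statement is usually written in the standard textbook form (background only, not the cited paper's own result): for two ideal gases at the same temperature and pressure, with mole fractions x_1 and x_2 and total amount n,

\[
  \Delta S_{\mathrm{mix}} =
  \begin{cases}
    -nR\,(x_1 \ln x_1 + x_2 \ln x_2) > 0, & \text{different gases},\\
    0, & \text{identical gases},
  \end{cases}
\]

so the conventional account accepts a discontinuous drop, from nR ln 2 in the equimolar case to zero, no matter how similar the two gases become; it is this discontinuity that the abstract above argues against.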
The Paradox of Intervening in Complex Adaptive Systems; Comment on “Using Complexity and Network Concepts to Inform Healthcare Knowledge Translation”
This commentary addresses two points raised by Kitson and colleagues’ article. First, the increasing interest in applying a Complexity Theory lens in healthcare needs further systematic work to create some commonality between the concepts used. Second, our need to adopt a better understanding of how these systems organise, so that we can change the systems’ overall behaviour, creates a paradox. We seek to m...
Journal title: Entropy
Volume 10, Issue -
Pages -
Publication date: 2008